Improved matrix algorithms via the Subsampled Randomized Hadamard Transform
Authors
Abstract
Several recent randomized linear algebra algorithms rely upon fast dimension reduction methods. A popular choice is the Subsampled Randomized Hadamard Transform (SRHT). In this article, we address the efficacy, in the Frobenius and spectral norms, of an SRHT-based low-rank matrix approximation technique introduced by Woolfe, Liberty, Rokhlin, and Tygert. We establish a slightly better Frobenius norm error bound than currently available, and a much sharper spectral norm error bound (in the presence of reasonable decay of the singular values). Along the way, we produce several results on matrix operations with SRHTs (such as approximate matrix multiplication) that may be of independent interest. Our approach builds upon Tropp’s in “Improved analysis of the Subsampled Randomized Hadamard Transform”.
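For readers unfamiliar with the transform itself: an SRHT sketches an n × d matrix A down to r rows via S = √(n/r)·R·H·D, where D flips row signs at random, H is the normalized Walsh–Hadamard matrix, and R samples r rows uniformly. The following is a minimal NumPy sketch of this construction, not code from the paper; the helper names `fwht` and `srht` are our own, and n is assumed to be a power of two:

```python
import numpy as np

def fwht(a):
    """Unnormalized fast Walsh-Hadamard transform along axis 0.

    The leading dimension must be a power of two. Runs in O(n log n)
    per column via the standard iterative butterfly.
    """
    a = np.array(a, dtype=float)  # work on a copy
    n = a.shape[0]
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            x = a[i:i + h].copy()
            a[i:i + h] = x + a[i + h:i + 2 * h]
            a[i + h:i + 2 * h] = x - a[i + h:i + 2 * h]
        h *= 2
    return a

def srht(A, r, seed=0):
    """Return an r x d SRHT sketch S @ A with S = sqrt(n/r) * R H D."""
    n, d = A.shape
    assert n & (n - 1) == 0, "number of rows must be a power of 2"
    rng = np.random.default_rng(seed)
    signs = rng.choice([-1.0, 1.0], size=n)        # D: random row signs
    B = fwht(signs[:, None] * A) / np.sqrt(n)      # normalized H applied to D A
    rows = rng.choice(n, size=r, replace=False)    # R: uniform row sample
    return np.sqrt(n / r) * B[rows]
```

The scaling √(n/r) makes the sketch an unbiased estimator of inner products, i.e. E[(SA)ᵀ(SA)] = AᵀA, which is what the approximate matrix multiplication results mentioned above quantify.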
Related articles
Improved Low-rank Matrix Decompositions via the Subsampled Randomized Hadamard Transform
We comment on two randomized algorithms for constructing low-rank matrix decompositions. Both algorithms employ the Subsampled Randomized Hadamard Transform [14]. The first algorithm appeared recently in [9]; here, we provide a novel analysis that significantly improves the approximation bound obtained in [9]. A preliminary version of the second algorithm appeared in [7]; here, we present a mil...
Faster Ridge Regression via the Subsampled Randomized Hadamard Transform
We propose a fast algorithm for ridge regression when the number of features is much larger than the number of observations (p ≫ n). The standard way to solve ridge regression in this setting works in the dual space and gives a running time of O(n²p). Our algorithm, Subsampled Randomized Hadamard Transform Dual Ridge Regression (SRHT-DRR), runs in time O(np log(n)) and works by preconditioning the de...
Improved Analysis of the Subsampled Randomized Hadamard Transform
This paper presents an improved analysis of a structured dimension-reduction map called the subsampled randomized Hadamard transform. This argument demonstrates that the map preserves the Euclidean geometry of an entire subspace of vectors. The new proof is much simpler than previous approaches, and it offers—for the first time—optimal constants in the estimate on the number of dimensions requi...
RADAGRAD: Random Projections for Adaptive Stochastic Optimization
We present RADAGRAD, a simple and computationally efficient approximation to full-matrix ADAGRAD based on dimensionality reduction using the subsampled randomized Hadamard transform. RADAGRAD is able to capture correlations in the gradients and achieves a similar regret, both in theory and empirically, to full-matrix ADAGRAD, but at a computational cost comparable to the diagonal variant.
Fast Regression with an ℓ∞ Guarantee
Sketching has emerged as a powerful technique for speeding up problems in numerical linear algebra, such as regression. In the overconstrained regression problem, one is given an n × d matrix A, with n ≫ d, as well as an n × 1 vector b, and one wants to find a vector x̂ so as to minimize the residual error ‖Ax − b‖₂. Using the sketch-and-solve paradigm, one first computes S · A and S · b for a rand...
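The sketch-and-solve recipe described in that abstract can be illustrated with a small self-contained example. Here a Gaussian sketch stands in for the SRHT purely to keep the code short; the variable names and problem sizes are illustrative, not from any of the papers above:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, r = 2048, 10, 200  # overconstrained: n >> d, sketch size r between d and n

# A synthetic least-squares instance: minimize ||Ax - b||_2.
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true + 0.01 * rng.normal(size=n)

# Gaussian sketching matrix S (r x n), scaled so E[S^T S] = I.
S = rng.normal(size=(r, n)) / np.sqrt(r)

# Sketch and solve: fit on the compressed (r x d) system instead of the full one.
x_hat, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)

# Exact solution on the full system, for comparison.
x_opt, *_ = np.linalg.lstsq(A, b, rcond=None)
```

With r on the order of d/ε², the sketched solution's residual is within a (1 + ε) factor of the optimum with high probability; swapping in an SRHT for S reduces the cost of forming S · A from O(nrd) to O(nd log n).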
Journal:
SIAM Journal on Matrix Analysis and Applications
Volume: 34, Issue: -
Pages: -
Year: 2013